MidiNet: A Convolutional Generative Adversarial Network for Symbolic-Domain Music Generation

Authors

  • Li-Chia Yang
  • Szu-Yu Chou
  • Yi-Hsuan Yang
Abstract

Most existing neural network models for music generation use recurrent neural networks. However, the recent WaveNet model proposed by DeepMind shows that convolutional neural networks (CNNs) can also generate realistic musical waveforms in the audio domain. In this light, we investigate using CNNs for generating melody (a series of MIDI notes) one bar after another in the symbolic domain. In addition to the generator, we use a discriminator to learn the distributions of melodies, making it a generative adversarial network (GAN). Moreover, we propose a novel conditional mechanism to exploit available prior knowledge, so that the model can generate melodies either from scratch, by following a chord sequence, or by conditioning on the melody of previous bars (e.g. a priming melody), among other possibilities. The resulting model, named MidiNet, can be expanded to generate music with multiple MIDI channels (i.e. tracks). We conduct a user study to compare the eight-bar melodies generated by MidiNet and by Google’s MelodyRNN models, each time using the same priming melody. Results show that MidiNet performs comparably with the MelodyRNN models in being realistic and pleasant to listen to, yet MidiNet’s melodies are reported to be much more interesting.
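MidiNet's CNN generator and discriminator operate on bar-wise symbolic data rather than raw audio: a bar of melody is typically encoded as a binary piano-roll matrix (time steps × pitches) that a convolutional network can produce or score. Below is a minimal sketch of such an encoding; the 16-steps-per-bar resolution and the pitch range are illustrative assumptions, not the paper's exact settings.

```python
# Encode one bar of melody as a binary piano-roll matrix:
# rows = time steps (assumed 16 sixteenth-note steps per bar),
# columns = MIDI pitches in an assumed range [48, 84).

STEPS_PER_BAR = 16
PITCH_LOW, PITCH_HIGH = 48, 84  # illustrative pitch range
N_PITCHES = PITCH_HIGH - PITCH_LOW

def bar_to_pianoroll(notes):
    """notes: list of 16 MIDI note numbers (or None for a rest)."""
    roll = [[0] * N_PITCHES for _ in range(STEPS_PER_BAR)]
    for step, note in enumerate(notes):
        if note is not None and PITCH_LOW <= note < PITCH_HIGH:
            roll[step][note - PITCH_LOW] = 1
    return roll

# A simple C-major arpeggio bar (each note held for one step):
melody = [60, 64, 67, 72] * 4
roll = bar_to_pianoroll(melody)
print(len(roll), len(roll[0]))   # 16 36
print(roll[0][60 - PITCH_LOW])   # 1  (C4 is active at step 0)
```

In this representation, conditioning on a previous bar or a chord simply means feeding an extra matrix (or a chord vector) alongside the random noise input to the generator.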


Similar resources

MuseGAN: Demonstration of a Convolutional GAN Based Model for Generating Multi-track Piano-rolls

Generating realistic and aesthetically pleasing pieces is one of the most exciting tasks in the field. We present in this demo paper a new neural music generation model we recently proposed, called MuseGAN. We exploit the potential of applying generative adversarial networks (GANs) to generate multi-track pop/rock music of four bars, using convolutions in both the generators and the discriminators. Moreover...


MuseGAN

Generating music has a few notable differences from generating images and videos. First, music is an art of time, necessitating a temporal model. Second, music is usually composed of multiple instruments/tracks, with close interaction with one another. Each track has its own temporal dynamics, but collectively they unfold over time interdependently. Lastly, for symbolic domain music generation,...
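The multi-track setting described above extends the bar-wise piano-roll idea: each track gets its own time-step × pitch matrix, and the tracks share a common time axis so their interaction can be modeled jointly. A hypothetical sketch of that layout (the shapes and track names are illustrative, not MuseGAN's exact configuration):

```python
# Multi-track piano-roll: one binary matrix per track, all sharing
# the same time axis (shapes and track names are illustrative).

STEPS, PITCHES = 16, 36
TRACKS = ["bass", "guitar", "piano", "drums"]  # hypothetical track list

def empty_roll():
    return [[0] * PITCHES for _ in range(STEPS)]

multitrack = {name: empty_roll() for name in TRACKS}

# Each track has its own temporal dynamics, but events at the same
# step index are simultaneous, which is what lets a model capture
# cross-track interdependence.
multitrack["bass"][0][0] = 1    # bass note at step 0
multitrack["piano"][0][12] = 1  # chord tone sounding at the same step

print(len(multitrack), len(multitrack["bass"]), len(multitrack["bass"][0]))
# 4 16 36
```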


Language Generation with Recurrent Generative Adversarial Networks without Pre-training

Generative Adversarial Networks (GANs) have shown great promise recently in image generation. Training GANs for text generation has proven to be more difficult, because of the non-differentiable nature of generating text with recurrent neural networks. Consequently, past work has either resorted to pre-training with maximum likelihood or used convolutional networks for generation. In this work, ...


Deep Generative Adversarial Neural Networks for Realistic Prostate Lesion MRI Synthesis

Generative Adversarial Neural Networks (GANs) are applied to the synthetic generation of prostate lesion MRI images. GANs have been applied to a variety of natural images; here it is shown that the same techniques can be used in the medical domain to create realistic-looking synthetic lesion images. 16 mm × 16 mm patches are extracted from 330 MRI scans from the SPIE ProstateX Challenge 2016 and used...


Improvement of generative adversarial networks for automatic text-to-image generation

This research concerns the use of deep learning tools and image processing technology for the automatic generation of images from text. Previous research has used a single sentence to produce an image. In this research, a memory-based hierarchical model is presented that uses three different descriptions, given in the form of sentences, to produce and refine the image. The proposed ...



Publication date: 2017